Recurrent Neural Networks

Why recurrent

RNNs are called recurrent because they perform the same computation for every element of a sequence, with each output depending on the previous computations.
Another way to think about RNNs is that they have a “memory” that captures information about what has been computed so far.

Architecture and components

![[assets/images/RNN-1.png|400]]

$h^{(t)} = f(Ux^{(t)} + Wh^{(t-1)})$
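
Unrolling this recurrence makes the “memory” concrete: every hidden state is a function of the entire input prefix. With three steps, and assuming some initial state $h^{(0)}$:

$h^{(3)} = f\big(Ux^{(3)} + W\, f\big(Ux^{(2)} + W\, f(Ux^{(1)} + Wh^{(0)})\big)\big)$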

Forward pass
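
A minimal NumPy sketch of the forward pass, assuming $f = \tanh$ and a hypothetical readout matrix $V$ mapping each hidden state to an output; the recurrence itself matches the equation above, everything else is illustrative:

```python
import numpy as np

def rnn_forward(xs, U, W, V, h0):
    """Run the recurrence h(t) = tanh(U x(t) + W h(t-1)) over a sequence.

    xs: list of input vectors x(1)..x(T); h0: initial hidden state.
    Returns the hidden states and the (assumed) outputs o(t) = V h(t).
    """
    h, hs, os = h0, [], []
    for x in xs:
        h = np.tanh(U @ x + W @ h)   # same U, W reused at every step
        hs.append(h)
        os.append(V @ h)             # hypothetical linear readout
    return hs, os
```

Note that the same $U$ and $W$ are applied at every time step; this parameter sharing is exactly what makes the network “recurrent”.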

Backward pass

Gradients are computed with back-propagation through time (BPTT): a forward pass moves left to right through the unrolled graph shown above, then a backward pass moves right to left, accumulating the gradient contribution of every time step.
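
A matching sketch of the backward pass, again assuming $f = \tanh$, the readout $o^{(t)} = Vh^{(t)}$ from the forward-pass example, and an illustrative squared-error loss against targets `ys`:

```python
import numpy as np

def rnn_backward(xs, ys, hs, os, U, W, V, h0):
    """Back-propagation through time for the vanilla RNN sketched above."""
    dU, dW, dV = np.zeros_like(U), np.zeros_like(W), np.zeros_like(V)
    dh_next = np.zeros_like(hs[0])        # gradient arriving from step t+1
    for t in reversed(range(len(xs))):    # right-to-left through the graph
        do = os[t] - ys[t]                # dL/do(t) for squared error
        dV += np.outer(do, hs[t])
        dh = V.T @ do + dh_next           # local gradient + future steps
        da = dh * (1.0 - hs[t] ** 2)      # through tanh: f'(a) = 1 - h^2
        dU += np.outer(da, xs[t])
        dW += np.outer(da, hs[t - 1] if t > 0 else h0)
        dh_next = W.T @ da                # hand the gradient back to t-1
    return dU, dW, dV
```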

RNNs & Memory

![[RNN-10.png|400]]

Pros & Cons

- Pros: handles variable-length sequences; the same weights are shared across all time steps, so model size does not grow with sequence length; the hidden state can, in principle, carry information from arbitrarily far back.
- Cons: computation is inherently sequential, so it is hard to parallelize across time steps; training suffers from vanishing and exploding gradients, which makes long-range dependencies difficult to learn in practice.
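
A small numeric illustration of the vanishing-gradient point, under assumed conditions (random weights, a fixed tanh hidden state for simplicity): the gradient reaching step $t-k$ has passed through $k$ Jacobians of the form $W^{\top}\mathrm{diag}(1 - h^2)$, and their product tends to shrink geometrically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
W = rng.normal(scale=0.5 / np.sqrt(n), size=(n, n))  # recurrent weights
h = np.tanh(rng.normal(size=n))       # fixed hidden state (simplification)
g = rng.normal(size=n)                # gradient arriving at the final step

for k in range(1, 101):
    g = W.T @ (g * (1.0 - h ** 2))    # one BPTT step through tanh and W
    if k % 25 == 0:
        print(f"{k:3d} steps back: |grad| = {np.linalg.norm(g):.3e}")
```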